Averaged Conservative Boosting: Introducing a New Method to Build Ensembles of Neural Networks
Abstract
In this paper, a new algorithm called Averaged Conservative Boosting (ACB) is presented to build ensembles of neural networks. ACB combines the improvements that Averaged Boosting (Aveboost) and Conservative Boosting (Conserboost) made to Adaptive Boosting (Adaboost): the conservative equation used in Conserboost is applied together with the averaging procedure used in Aveboost to update the sampling distribution used in Adaboost training. We have tested the methods on seven databases from the UCI repository. The results show that the best performance is obtained by our method, Averaged Conservative Boosting.
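The update described above can be sketched in code. This is a minimal illustration, not the paper's exact formulation: it assumes a Conserboost-style conservative step (only misclassified examples have their weight raised) followed by an Aveboost-style running average of the old and updated distributions; the function name `acb_update` and the exact weighting factor are assumptions for illustration.

```python
import numpy as np

def acb_update(d, errors, t):
    """One sketch of an ACB-style sampling-distribution update.

    d      : current sampling distribution over the training examples
    errors : boolean array, True where the current network misclassifies
    t      : 1-based index of the current boosting round
    """
    eps = float(np.sum(d[errors]))       # weighted error of the current learner
    eps = min(max(eps, 1e-10), 1 - 1e-10)
    beta = np.sqrt((1 - eps) / eps)      # assumed conservative factor

    c = d.copy()
    c[errors] *= beta                    # raise only the misclassified examples
    c /= c.sum()                         # renormalise to a distribution

    # Aveboost-style averaging of the previous and updated distributions
    return (t * d + c) / (t + 1)
```

Averaging with the previous distribution damps the aggressive weight growth of plain Adaboost, while the conservative step avoids down-weighting already-correct examples.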
Similar Papers
Ensemble strategies to build neural network to facilitate decision making
There are three major strategies for forming neural network ensembles. The simplest is the cross-validation strategy, in which all members are trained with the same training data. Bagging and boosting strategies produce perturbed samples from the training data. This paper provides an ideal model based on two important factors: the activation function and the number of neurons in the hidden layer, and based u...
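The "perturbed sample" that bagging draws can be illustrated with a generic bootstrap: sample the training set with replacement to the same size, so each ensemble member sees a slightly different dataset. This is a standard sketch of bagging's sampling step, not the cited paper's specific model; the helper name `bootstrap_sample` is an assumption.

```python
import random

def bootstrap_sample(data, rng=None):
    """Bagging-style perturbed sample: draw len(data) examples
    with replacement from the original training data."""
    rng = rng or random.Random(0)
    return [rng.choice(data) for _ in range(len(data))]
```

Because sampling is with replacement, each bootstrap sample omits roughly a third of the original examples and duplicates others, which decorrelates the ensemble members.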
Decision Fusion on Boosting Ensembles
Training an ensemble of neural networks is an interesting way to build a multi-net system. One of the key factors in designing an ensemble is how to combine the networks to give a single output. Although there are several important methods for building ensembles, Boosting is one of the most important. Most methods based on Boosting use a specific combiner (the Boosting Combiner). Although the Boosti...
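The combiner mentioned above is commonly the standard Adaboost weighted vote: each network votes for a class with weight ln((1 - ε) / ε), where ε is its weighted training error. A hedged sketch of that standard combiner follows; the function name and array layout are assumptions, and the snippet cited here may use a variant.

```python
import numpy as np

def boosting_combiner(predictions, errors):
    """Adaboost-style weighted vote over T networks.

    predictions : (T, n) array of predicted class labels for n samples
    errors      : length-T array of each network's weighted training error
    """
    errors = np.clip(errors, 1e-10, 1 - 1e-10)
    alphas = np.log((1 - errors) / errors)   # vote weight per network
    classes = np.unique(predictions)
    n = predictions.shape[1]
    votes = np.zeros((len(classes), n))
    for k, c in enumerate(classes):
        # sum the weights of all networks that voted for class c
        votes[k] = ((predictions == c) * alphas[:, None]).sum(axis=0)
    return classes[np.argmax(votes, axis=0)]
```

Networks with low training error receive larger α and therefore dominate the vote, which is why replacing this combiner with other fusion rules is a natural question to study.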
A Case Study on Bagging, Boosting, and Basic Ensembles of Neural Networks for OCR
We study the effectiveness of three neural network ensembles in improving OCR performance: (i) Basic, (ii) Bagging, and (iii) Boosting. Three random character degradation models are introduced in training individual networks in order to reduce error correlation between individual networks and to improve the generalization ability of neural networks. We compare the recognition accuracies of t...
Class-switching neural network ensembles
This article investigates the properties of class-switching ensembles composed of neural networks and compares them to class-switching ensembles of decision trees and to standard ensemble learning methods, such as bagging and boosting. In a class-switching ensemble, each learner is constructed using a modified version of the training data. This modification consists of switching the class label...
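The modification described above can be sketched as follows: a fraction p of the training labels is replaced with a different class, chosen at random, before training each member. This is a generic sketch of the class-switching idea, not the article's exact procedure; the helper name `class_switch` is an assumption.

```python
import random

def class_switch(labels, classes, p, rng=None):
    """Switch the class label of a fraction p of the training examples
    to a randomly chosen different class."""
    rng = rng or random.Random(0)
    out = list(labels)
    n_switch = int(p * len(out))
    for i in rng.sample(range(len(out)), n_switch):
        out[i] = rng.choice([c for c in classes if c != out[i]])
    return out
```

Each ensemble member is trained on its own independently switched copy of the data, so the members disagree on the corrupted examples and their majority vote can recover the clean labels.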
Popular Ensemble Methods: An Empirical Study
An ensemble consists of a set of individually trained classifiers (such as neural networks or decision trees) whose predictions are combined when classifying novel instances. Previous research has shown that an ensemble is often more accurate than any of the single classifiers in the ensemble. Bagging (Breiman, 1996c) and Boosting (Freund & Schapire, 1996; Schapire, 1990) are two relatively new...